Enable reloading of dataloaders every n epochs instead of every epoch #5043
Conversation
you need to deprecate the old argument. Also maybe rename it. And of course, pls add a test too.
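A minimal sketch of what that deprecation path could look like, assuming a hypothetical helper that maps the old bool flag onto the new int argument (the names here are illustrative, not the PR's actual implementation):

```python
import warnings

def _resolve_reload_args(reload_dataloaders_every_epoch=False,
                         reload_dataloaders_every_n_epochs=0):
    # If the deprecated bool flag is set, warn and translate it to the
    # new int semantics: True means "reload every 1 epoch".
    if reload_dataloaders_every_epoch:
        warnings.warn(
            "`reload_dataloaders_every_epoch` is deprecated, use"
            " `reload_dataloaders_every_n_epochs` instead.",
            DeprecationWarning,
        )
        return 1
    return reload_dataloaders_every_n_epochs
```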
LGTM! Minor changes and a test.
Codecov Report
```diff
@@            Coverage Diff            @@
##           master    #5043     +/-   ##
=========================================
- Coverage      92%       88%       -5%
=========================================
  Files         213       213
  Lines       13798     13809      +11
=========================================
- Hits        12744     12132     -612
- Misses       1054      1677     +623
```
@rohitgr7 after looking at …
ah.. I was just looking at these two arguments, so I thought …
Hello @sid-sundrani! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-07-07 01:18:59 UTC
@rohitgr7 the warning doesn't print for some reason if I use …
Should I do the former?
Can you add the warning at the correct place and a failing test? I'll check it. @sid-sundrani
Let's make it an int only, no bool, defaulting to 0, since other similar parameters (`check_val_every_n_epoch`, `log_every_n_steps`, ...) don't accept a bool value.
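A quick sketch of the kind of check this implies (a hypothetical helper, not the PR's actual code); note that bools must be rejected explicitly, since `isinstance(True, int)` is `True` in Python:

```python
def _validate_reload_every_n_epochs(value):
    # Reject bools explicitly: a plain int check alone would silently
    # accept True/False, which this argument should not allow.
    if isinstance(value, bool) or not isinstance(value, int) or value < 0:
        raise TypeError(
            f"`reload_dataloaders_every_n_epochs` should be an int >= 0, got {value!r}."
        )
```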
You also need to update the docs here: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/core/hooks.py
Also this test: https://github.com/PyTorchLightning/pytorch-lightning/blob/eb9cb3c8017b1f207ac1d2c7168327fbe0b63cb5/tests/core/test_datamodules.py#L437
And these docs too: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/docs/source/trainer.rst
LGTM. I'm just wondering: if someone sets `limit_train/val/test_batches=0`, should it even load the dataloaders at all?
@sid-sundrani pls mark it as ready for review if it's ready.
@sid-sundrani any chance you can rebase and address the final comments? We would love to finally merge!
Pull request was closed
What does this PR do?
This PR extends the logic of reloading dataloaders every epoch so that dataloaders can be reloaded every n epochs. It renames the argument from `reload_dataloaders_every_epoch` to `reload_dataloaders_every_n_epochs`. The parameter accepts an `int` >= 0; the default value is 0.

Fixes #5001
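A minimal usage sketch of the renamed argument (`model` stands in for any LightningModule defined elsewhere):

```python
from pytorch_lightning import Trainer

# Reload the dataloaders every 2 epochs; with the default of 0 they
# are loaded only once, matching the previous default behavior.
trainer = Trainer(reload_dataloaders_every_n_epochs=2)
trainer.fit(model)
```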
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines.
Did you have fun?
Make sure you had fun coding 🙃